United States General Accounting Office
Testimony
GAO
Supercomputing in Industry
For Release on Delivery
Expected at 2:00 p.m. EST Tuesday, March 5, 1991
Statement for the record by
Jack L. Brock, Jr.,
Director, Government Information and Financial Management Issues
Information Management and Technology Division
Before the Subcommittee on Science, Technology, and Space
Committee on Commerce, Science, and Transportation
United States Senate
GAO/T-IMTEC-91-3
Messrs. Chairman and Members of the Committee and Subcommittee:
I am pleased to submit this statement for the record, as part of the
Committee's hearing on the proposed High Performance Computing
Act of 1991. The information contained in this statement reflects the
work that GAO has conducted to date on its review of how industries
are using supercomputers to improve productivity, reduce costs, and
develop new products. At your request, this work has focused on
four specific industries--oil, aerospace, automobile, and
pharmaceutical/chemical--and was limited to determining how these
industries use supercomputers and to citing reported benefits.
We developed this material through an extensive review of
published documents and through interviews with knowledgeable
representatives within the selected industries. In some cases, our
access to proprietary information was restricted. Since this statement
for the record reports on work still in progress, it may not fully
characterize industry use of supercomputers, or the full benefits
likely to accrue from such use.
BACKGROUND
A supercomputer, by its most basic definition, is the most powerful
computer available at a given time. While the term supercomputer
does not refer to a particular design or type of computer, the basic
design philosophy emphasizes vector or parallel processing,
[Footnote 1: Vector processing provides the capability of operating on
arrays, or vectors, of information simultaneously. With parallel
processing, multiple parts of a program are executed concurrently.
Massively parallel supercomputers are currently defined as those
having over 1,000 processors.]
aimed at performing very large numbers of calculations rapidly. Current
supercomputers, ranging in cost from $1 million to $30 million, are
capable of performing hundreds of millions or even billions of
calculations each second. Computations requiring many hours or days
on more conventional computers may be accomplished in a few
minutes or seconds on a supercomputer.
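The distinction footnote 1 draws between conventional scalar computation and vector processing can be sketched in modern terms as follows. This is an illustrative sketch only; the array library, array sizes, and function names are assumptions for demonstration, not anything described in the testimony:

```python
import numpy as np

# Scalar-style processing: operate on one element at a time, as a
# conventional computer of the era would.
def scalar_add(a, b):
    result = [0.0] * len(a)
    for i in range(len(a)):
        result[i] = a[i] + b[i]
    return result

# Vector-style processing: a single operation applied to entire arrays
# (vectors) at once -- the style of computation footnote 1 describes.
a = np.arange(1_000_000, dtype=np.float64)
b = np.arange(1_000_000, dtype=np.float64)
vector_sum = a + b  # one array operation instead of a million scalar steps
```

On a vector supercomputer, the second form maps onto hardware that streams whole arrays through the arithmetic units, which is where the "hundreds of millions or even billions of calculations each second" comes from.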
The unique computational power of supercomputers makes it
possible to find solutions to critical scientific and engineering
problems that cannot be dealt with satisfactorily by theoretical,
analytical, or experimental means. Scientists and engineers in many
fields--including aerospace, petroleum exploration, automobile
design and testing, chemistry, materials science, and electronics--
emphasize the value of supercomputers in solving complex problems.
Much of this work centers around scientific visualization, a technique
allowing researchers to plot masses of raw data in three dimensions
to create visual images of objects or systems under study. This
enables researchers to model abstract data, allowing them to "see"
and thus comprehend more readily what the data reveal.
While supercomputers are still relatively limited in use, their
number has risen dramatically over the last decade. In the early 1980s, most
of the 20 to 30 supercomputers in existence were operated by
government agencies for such purposes as weapons research and
weather modeling. Today about 280 supercomputers
[Footnote 2: This figure includes only high-end supercomputers such
as those manufactured by Cray Research, Inc. Including International
Business Machines (IBM) mainframes with vector facilities would
about double this number.]
are in use worldwide. Government (including defense-related
industry) remains the largest user, although private industry has
been the fastest growing user segment for the past few years and is
projected to remain so.
The industries we are examining enjoy a reputation for using
supercomputers to solve complex problems for which solutions might
otherwise be unattainable. Additionally, they represent the largest
group of supercomputer users. Over one-half of the 280
supercomputers in operation are being used for oil exploration;
aerospace modeling, testing, and development; automotive testing
and design; and chemical and pharmaceutical applications.
THE OIL INDUSTRY
The oil industry uses supercomputers to better determine the
location of oil reservoirs and to maximize the recovery of oil from
those reservoirs. Such applications have become increasingly
important because of the low probability of discovering large oil
fields in the continental United States. New oil fields are often small,
hard to find, and located in harsh environments, making exploration
and production difficult. The oil industry uses two key
supercomputer applications, seismic data processing and reservoir
simulation, to aid in oil exploration and production. These
applications have saved money and increased oil production.
Seismic data processing increases the probability of determining
where oil reservoirs are located by analyzing large volumes of
seismic data
[Footnote 3: Seismic data are gathered by using sound-recording
devices to measure the speed at which vibrations travel through the
earth.]
and producing two- and three-dimensional images of subsurface
geology. Through the study of these images, geologists can better
understand the characteristics of the area, and determine the
probability of oil being present. More accurately locating oil
reservoirs is important because the average cost of drilling a well is
estimated at about $5.5 million and can reach as high as $50 million.
Under the best of circumstances, most test wells do not result in
enough oil to make drilling cost-effective. Thus, avoiding drilling one
dry well can save millions of dollars. The industry representatives
who agreed to share cost estimates with us said that supercomputer
use in seismic data processing reduces the number of dry wells
drilled by about 10 percent, at a savings of hundreds of millions of
dollars over the last 5 years.
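The basic idea behind the seismic measurements described in footnote 3 can be illustrated with the standard two-way travel-time relationship: a vibration reflected off a rock layer returns after a time that, combined with an assumed propagation velocity, yields an estimate of the layer's depth. The numeric values below are illustrative assumptions, not field data from the testimony:

```python
# Toy illustration of the relationship underlying seismic data processing:
# depth = velocity * two_way_travel_time / 2
# (the wave travels down and back, hence the division by two).

def reflector_depth(two_way_time_s, velocity_m_per_s):
    """Estimate the depth of a reflecting rock layer from the two-way
    travel time of a seismic wave and an assumed velocity."""
    return velocity_m_per_s * two_way_time_s / 2.0

# Example: a 2.0-second two-way time at an assumed 3000 m/s velocity.
depth = reflector_depth(2.0, 3000.0)
```

Actual seismic processing applies far more elaborate corrections across millions of such traces, which is why it demands supercomputer-class throughput.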
Reservoir simulation is used to increase the amount of oil that can be
extracted from a reservoir. Petroleum reservoirs are accumulations
of oil, water, and gas within the pores of rocks, located up to several
miles beneath the earth's surface. Reservoir modeling predicts the
flow of fluids in a reservoir so geologists can better determine how
oil should be extracted. Atlantic Richfield Company (ARCO)
representatives estimate that reservoir simulation used for the oil
field at Prudhoe Bay, Alaska--the largest in production in the United
States--has resulted in increased oil production worth billions of
dollars.
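The kind of calculation at the heart of reservoir simulation -- predicting how pressure and fluids spread through porous rock -- can be sketched with a one-dimensional diffusion step. The grid size, coefficient, and initial pressures below are illustrative assumptions; real reservoir models solve vastly larger three-dimensional systems, which is what requires a supercomputer:

```python
# Minimal sketch of a reservoir-style calculation: one explicit
# finite-difference time step of pressure diffusion, dp/dt = alpha * d2p/dx2,
# on a 1-D grid with fixed (zero-pressure) boundaries.

def diffuse_step(p, alpha_dt_dx2):
    """Advance the pressure profile by one explicit time step."""
    new_p = p[:]
    for i in range(1, len(p) - 1):
        new_p[i] = p[i] + alpha_dt_dx2 * (p[i - 1] - 2 * p[i] + p[i + 1])
    return new_p

# A pressure pulse in the middle cell spreads outward step by step.
pressure = [0.0, 0.0, 100.0, 0.0, 0.0]
for _ in range(10):
    pressure = diffuse_step(pressure, 0.25)  # 0.25 keeps the scheme stable
```

Each step here is a handful of arithmetic operations; a production reservoir model repeats the same idea over millions of grid cells and thousands of time steps.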
THE AEROSPACE INDUSTRY
Engineers and researchers also use supercomputers to design,
develop, and test aerospace vehicles and related equipment. In
particular, computational fluid dynamics, which is dependent upon
supercomputing, enables engineers to simulate the flow of air and
fluid around proposed design shapes and then modify designs
accordingly. The simulations performed using this application are
valuable in eliminating some of the traditional wind tunnel tests
used in evaluating the aerodynamics of airplanes. Wind tunnels are
expensive to build and maintain, require costly construction of
physical models, and cannot reliably detect certain airflow
phenomena. Supercomputer-based design has thus resulted in
significant time and cost savings, as well as better designs, for the
aerospace industry.
Lockheed Aerospace used computational fluid dynamics on a
supercomputer to develop a computer model of the Advanced
Tactical Fighter for the U.S. Air Force. By using this approach,
Lockheed was able to display a full-vehicle computer model of the
fighter after approximately 5 hours of supercomputer processing
time. This approach allowed Lockheed to reduce the amount of wind-
tunnel testing by 80 hours, resulting in savings of about half a
million dollars.
The Boeing Aircraft Company used a Cray 1S-2000 supercomputer to
redesign the 17-year-old 737-200 aircraft in the early 1980s. Aiming
to create a more fuel-efficient plane, Boeing decided to make the
body design longer and replace the engines with larger but more
efficient models. To determine the appropriate placement of these
new engines, Boeing used the supercomputer to simulate a wind-
tunnel test. The results of this simulation--which were much more
detailed than would have been available from an actual wind-tunnel
test--allowed the engineers to solve the engine placement problem
and create a more fuel-efficient aircraft.
THE AUTOMOBILE INDUSTRY
Automobile manufacturers have been using supercomputers
increasingly since 1985 as a design tool to make cars safer, lighter,
more economical, and better built. Further, the use of
supercomputers has allowed the automobile industry to achieve
these design improvements at significant savings.
One supercomputer application receiving increasing interest is
automobile crash simulation. To meet federally mandated
crashworthiness requirements, the automobile industry crashes large
numbers of pre-prototype vehicles head-on at 30 miles per hour into
rigid barriers. Vehicles for such tests can cost from $225,000 to
$750,000 each. Crash simulation using supercomputers, however,
provides more precise engineering information than is typically
available from actually crashing vehicles. In addition, using
supercomputers to perform this type of structural analysis reduces
the number of actual crash tests required by 20 to 30 percent, saving
the companies millions of dollars each year. Simulations such as this
were not practical prior to the development of vector
supercomputing because of the volume and complexity of data
involved.
Automobile companies credit supercomputers with improving
automobile design in other ways as well. For example, Chrysler
Corporation engineers use linear analysis and weight optimization
software on a Cray X-MP supercomputer to improve the design of its
vehicles. The resulting designs--which, according to a Chrysler
representative, would not have been practical without a
supercomputer--will allow Chrysler to achieve an annual reduction
of about $3 million in the cost of raw materials for manufacturing its
automobiles. In addition, one automobile's body was made 10
percent more rigid (which will improve ride and handling) and 11
percent lighter (which will improve fuel efficiency). According to the
Chrysler representative, this is typical of improvements that are
being achieved through the use of its supercomputer.
THE CHEMICAL AND PHARMACEUTICAL INDUSTRIES
Supercomputers play a growing role in the chemical and
pharmaceutical industries, although their use is still in its infancy.
From computer-assisted molecular design to synthetic materials
research, companies in these fields increasingly rely on
supercomputers to study critical design parameters and more
quickly and accurately interpret and refine experimental results.
Industry representatives told us that, as a result, the use of
supercomputing will result in new discoveries that may not have
been possible otherwise.
The pharmaceutical industry is beginning to use supercomputers as a
research tool in developing new drugs. Development of a new drug
may require synthesizing and screening up to 30,000 compounds, at a
cost of about $5,000 per synthesis. As such, up to $150
million, before clinical testing and other costs, may be invested in
discovering a new drug, according to an E.I. du Pont de Nemours and
Company representative. Scientists can now eliminate some of this
testing by using simulation on a supercomputer. The supercomputer
analyzes and interprets complex data obtained from experimental
measurements. Then, using workstations, scientists can construct
three-dimensional models of the large, complex human proteins and
enzymes on the computer screen and rotate these images to gain
clues regarding biological activity and reactions to various potential
drugs.
Computer simulations are also being used in the chemical industry to
replace or enhance more traditional laboratory measurements. Du
Pont is currently working to develop replacements for
chlorofluorocarbons, compounds used as coolants for refrigerators
and air conditioners, and as cleansing agents for electronic
equipment. These compounds are generally thought to contribute to
the ozone depletion of the atmosphere and are being phased out. Du
Pont is designing a new process to produce substitute compounds in
a safe and cost-effective manner. These substitutes will be more
reactive in the atmosphere and subject to faster decomposition. Du
Pont is using a supercomputer to calculate the thermodynamic data
needed for developing the process. These calculations can be
completed by the supercomputer in a matter of days, at an
approximate cost of $2,000 to $5,000. Previously, such tests--using
experimental measurements conducted in a laboratory--would
require up to 3 months to conduct, at a cost of about $50,000. Both
the cost and time required would substantially limit the amount of
testing done.
BARRIERS TO GREATER USE OF SUPERCOMPUTERS
These examples demonstrate the significant advantages in terms of
cost savings, product improvements, and competitive opportunity
that can be realized through supercomputer use. However, such use
is still concentrated in only a few industries. Our industry contacts
identified significant, interrelated barriers that, individually or
collectively, limit more widespread use of supercomputers.
Cost. Supercomputers are expensive. A supercomputer's cost of
between $1 million and $30 million does not include the cost of
software development, maintenance, or trained staff.
Cultural resistance. Simulation on supercomputers can not only reduce
physical testing, measurement, and experimentation, but can
also provide information that cannot otherwise be obtained. For many
scientists and managers this represents a dramatic break with past
training, experience, generally accepted methods, or common
doctrine. For some, such a major shift in research methodology is
difficult to accept. These new methods are simply resisted or ignored.
Lack of application software. Supercomputers can be difficult to use.
For many industry applications, reliable software has not yet been
developed. This is particularly true for massively parallel
supercomputers.
Lack of trained scientists in supercomputing. Between 1970 and
1985, university students and professors performed little of their
research on supercomputers. For 15 years, industry hired students
from universities who did not bring supercomputing skills and
attitudes into their jobs. Now, as a result, many high-level scientists,
engineers, and managers in industry have little or no knowledge of
supercomputing.
In conclusion, our work to date suggests that the use of
supercomputers has made substantial contributions in key U.S.
industries. While our statement has referred to benefits related to
cost reduction and time savings, we believe that supercomputers will
increasingly be used to gain substantive competitive advantage.
Supercomputers offer the potential--still largely untapped--to
develop new and better products more quickly. This potential is just
beginning to be explored, as are ways around the barriers that
prevent supercomputers from being more fully exploited.